Wednesday, October 8, 2008

Understanding Sarah Palin: Or, God Is In The Wattles

Here's a question for you. Why hasn't natural selection driven the religious right to extinction?

You should forgive me for asking. After all, here is a group of people who base their lives on patently absurd superstitions that fly in the face of empirical evidence. It's as if I suddenly chose to believe that I could walk off the edges of cliffs with impunity; you would not expect me to live very long. You would expect me to leave few if any offspring. You would expect me to get weeded out.

And yet, this obnoxious coterie of retards — people openly and explicitly contemptuous of "intellectuals" and "evilutionists" and, you know, anyone who actually spends their time learning stuff — they not only refuse to die, they appear to rule the world. Some Alaskan airhead who can't even fake the name of a newspaper, who can't seem to say anything without getting it wrong, who bald-facedly states in a formal debate setting that she's not even going to try to answer questions she finds unpalatable (or she would state as much, if she could say "unpalatable" without tripping over her own tongue) — this person, this behavior, is regarded as successful even by her detractors. The primary reason for her popularity amongst the all-powerful "low-information voters"1? In-your-face religious fundamentalism and an eye tic that would make a Tourette's victim blush.

You might suggest that my analogy is a bit loopy: young-earth creationism may fly in the face of reason, but it hardly has as much immediate survival relevance as my own delusory immunity to gravity. I would disagree. The Christian Church has been an anvil around the neck of scientific progress for centuries. It took the Catholics four hundred years to apologize to Galileo; a hundred fifty for an Anglican middle-management type to admit that they might owe one to Darwin too (although his betters immediately slapped him down for it). Even today, we fight an endless series of skirmishes with fundamentalists who keep trying to sneak creationism in through the back door of science classes across the continent. (I'm given to understand that Islamic fundies are doing pretty much the same thing in Europe.) More people in the US believe in angels than in natural selection. And has anyone not noticed that religious fundamentalists also tend to be climate-change deniers?

Surely, any cancer that attacks the very intellect of a society would put the society itself at a competitive disadvantage. Surely, tribes founded on secular empiricism would develop better technology, better medicines, better hands-on understanding of The Way Things Work, than tribes gripped by primeval cloud-worshipping superstition2. Why, then, are there so few social systems based on empiricism, and why are god-grovellers so powerful across the globe? Why do the Olympians keep getting their asses handed to them by a bunch of intellectual paraplegics?

The great thing about science is, it can even answer ugly questions like this. And a lot of pieces have been falling into place lately. Many of them have to do with the brain's fundamental role as a pattern-matcher.

Let's start with this study here, in the latest issue of Science. It turns out that the less control people feel they have over their lives, the more likely they are to perceive images in random visual static; the more likely they are to see connections and conspiracies in unrelated events. The more powerless you feel, the more likely you'll see faces in the clouds. (Belief in astrology also goes up during times of social stress.)
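
You can boil that result down to a threshold. Here's a toy signal-detection sketch in Python (every number invented, nothing to do with Whitson's actual methods) in which "feeling powerless" just means lowering the evidence bar a pattern-detector needs before it yells face:

```python
import numpy as np

rng = np.random.default_rng(0)
face = rng.standard_normal((8, 8))            # stand-in "face" template
static = rng.standard_normal((10_000, 8, 8))  # pure noise: no faces anywhere

# Detector: call "face!" whenever the match score beats a criterion.
scores = (static * face).sum(axis=(1, 2)) / face.size
for criterion in (0.20, 0.10, 0.05):          # powerless = lower criterion
    false_alarms = (scores > criterion).mean()
    print(f"criterion {criterion:.2f}: faces seen in {false_alarms:.1%} of pure static")
```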

Some of you may remember that I speculated along such lines back during my rant against that evangelical abortion that Francis Collins wrote while pretending to be a scientist; but thanks to Jennifer Whitson and her buddies, speculation resolves into fact. Obama was dead on the mark when he said that people cling to religion and guns during hard times. The one arises from loss of control, and the other from an attempt to get some back.

Leaving Lepidoptera (please don't touch the displays, little boy, heh heh heh— Oh, cute...) — moving to the next aisle, we have Arachnida, the spiders. And according to findings reported by Douglas Oxley and his colleagues (supplemental material here), right-wingers are significantly more scared of these furry little arthropods than left-wingers tend to be: at least, conservatives show stronger stress responses than liberals to "threatening" pictures of large spiders perched on human faces.

It's not a one-off effect, either. Measured in terms of blink amplitude and skin conductance, the strongest stress responses to a variety of threat stimuli occurred among folks who "favor defense spending, capital punishment, patriotism, and the Iraq War". In contrast, those who "support foreign aid, liberal immigration policies, pacifism, and gun control" tended to be pretty laid-back when confronted with the same stimuli. Oxley et al close off the piece by speculating that differences in political leanings may result from differences in the way the amygdala is wired— and that said wiring, in turn, has a genetic component. The implication is that right-wing/left-wing beliefs may to some extent be hardwired, making them relatively immune to the rules of evidence and reasoned debate. (Again, this is pure speculation. The experiments didn't extend into genetics. But it would explain a lot.)

One cool thing about the aforementioned studies is that they have relatively small sample sizes, both in the two-digit range. Any pattern that shows statistical significance in a sample that small has got to be pretty damn strong; both of these are.
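
If you want to check that claim on the back of an envelope, assume a garden-variety two-tailed Pearson correlation test (my numbers, nobody's actual analysis). The minimum effect that clears p < 0.05 balloons as the sample shrinks:

```python
from math import sqrt
from scipy.stats import t

# Smallest Pearson |r| that clears p < .05 (two-tailed) at sample size n.
# Moral: with n in the two-digit range, only hefty effects ever register.
for n in (20, 50, 100, 1000):
    df = n - 2
    t_crit = t.ppf(0.975, df)
    r_min = t_crit / sqrt(t_crit**2 + df)
    print(f"n = {n:4d}: |r| must exceed {r_min:.2f}")
```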

Now let's go back a ways, to a Cornell study from 1999 called "Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments". It's a depressing study, with depressing findings (there's a toy simulation after the list that reproduces the pattern):
  • People tend to overestimate their own smarts.
  • Stupid people tend to overestimate their smarts more than the truly smart do.
  • Smart people tend to assume that everyone else is as smart as they are; they honestly can't understand why dumber people just don't "get it", because it doesn't occur to them that those people actually are dumb.
  • Stupid people, in contrast, tend to not only regard themselves as smarter than everyone else, they tend to regard truly smart people as especially stupid. This holds true even when these people are shown empirical proof that they are less competent than those they deride.
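
The promised toy simulation. You get the signature Kruger-Dunning chart out of pretty much any model in which self-insight is weak and self-regard runs hot; this one is mine (all parameters invented), not theirs:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
skill = rng.standard_normal(n)                    # true competence
actual_pct = skill.argsort().argsort() / n * 100  # true percentile rank

# Self-rated percentile: weak insight (0.4 weight on true skill), plenty
# of noise, and an optimistic offset. All parameters invented.
rated_pct = np.clip(60 + 15 * (0.4 * skill + rng.standard_normal(n)), 0, 100)

for q in range(4):                                # quartiles of true skill
    m = (actual_pct >= 25 * q) & (actual_pct < 25 * (q + 1))
    print(f"quartile {q + 1}: actual {actual_pct[m].mean():5.1f}, "
          f"self-rated {rated_pct[m].mean():5.1f}")
```
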
So. The story so far:
  1. People perceive nonexistent patterns, meanings, and connections in random data when they are stressed, scared, and generally feel a loss of control in their own lives.
  2. Right-wing people are more easily scared/stressed than left-wing people. They are also more likely to cleave to authority figures and protectionist policies. There may be a genetic component to this.
  3. The dumber you are, the less likely you'll be able to recognize your own stupidity, and the lower will be your opinion of people who are smarter than you (even while those people keep treating you as though you are just as smart as they are).
Therefore (I would argue) the so-called "right wing" is especially predisposed to believe in moralizing, authoritarian Invisible Friends. And the dumber individuals (of any stripe) are, the more immune they are to reason. Note that, to paraphrase John Stuart Mill, I am not saying that conservatives are stupid (I myself know some very smart conservatives), but that stupid people tend to be conservative. Whole other thing.

So what we have, so far, is a biological mechanism for the prevalence of religious superstition in right-wing populations. What we need now is a reason why such populations tend to be so damn successful, given the obvious shortcomings of superstition as opposed to empiricism.

Which brings us to Norenzayan and Shariff's review paper in last week's Science on "The Origin and Evolution of Religious Prosociality". To get us in the mood they remind us of several previous studies, a couple of which I may have mentioned here before (at least, I mentioned them somewhere — if they're on the 'crawl, I evidently failed to attach the appropriate "ass-hamsters" tag). For example, it turns out that people are less likely to cheat on an assigned task if the lab tech lets slip that the ghost of a girl who was murdered in this very building was sighted down the hall the other day.

That's right. Plant the thought that some ghost might be watching you, and you become more trustworthy. Even sticking a picture of a pair of eyes on the wall reduces the incidence of cheating, even though no one would consciously mistake a drawing of eyes for the real thing. Merely planting the idea of surveillance seems to be enough to improve one's behavior. (I would also remind you of an earlier crawl entry reporting that so-called "altruistic" acts in our society tend to occur mainly when someone else is watching, although N&S don't cite that study in their review.)

There's also the recent nugget from which this figure was cadged:
This study found not only that religious communes last longer than secular ones, but that even among religious communes the ones that last longest are those with the most onerous, repressive, authoritarian rules.

And so on. Norenzayan and Shariff trot out study after study, addressing a variety of questions that may seem unrelated at first. If, as theorists suggest, human social groupings can only reach 150 members or so before they collapse or fragment from internal stress, why does the real world serve up so many groupings of greater size? (Turns out that the larger the size of a group, the more likely that its members believe in a moralizing, peeping-tom god.) Are religious people more likely than nonreligious ones to help out someone in distress? (Not so much.) What's the most common denominator tying together acts of charity by the religious? (Social optics. "Self-reported belief in God or self-reported religious devotion," the paper remarks wryly, "was not a reliable indicator of generous behavior in anonymous settings.") And why is it that religion seems especially prevalent in areas with chronic water and resource shortages?

It seems to come down to two things: surveillance and freeloading. The surveillance element is pretty self-evident. People engage in goodly behavior primarily to increase their own social status, to make themselves appear more valuable to observers. But by that same token, there's no point in being an upstanding citizen if there are no observers. In anonymous settings, you can cheat.

You can also cheat in nonanonymous settings, if your social group is large enough to get lost in. In small groups, everybody knows your name; if you put out your hand at dinner but couldn't be bothered hunting and gathering, if you sleep soundly at night and never stand guard at the perimeter, it soon becomes clear to everyone that you're a parasite. You'll get the shit kicked out of you, and be banished from the tribe. But as social groupings become larger you lose that everyone-knows-everyone safeguard. You can move from burb to burb, sponging and moving on before anyone gets wise—

unless the costs of joining that community in the first place are so bloody high that it just isn't worth the effort. This is where the onerous, Old-Testament social rituals come into play.
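
The economics fit in a few lines. A toy payoff model (my invented numbers, nothing from N&S): the sponger grabs more per round but doesn't stick around, so a steep entry fee sinks him while the lifer barely notices:

```python
# Costly entry as a freeloader filter. The drifter sponges hard but bails
# (or gets caught) within a few rounds; the devotee earns less per round but
# stays for the long haul. Only one of them can amortize a steep entry fee.

def lifetime_payoff(entry_cost, per_round_gain, rounds_in_group):
    return per_round_gain * rounds_in_group - entry_cost

for entry_cost in (0, 5, 20):
    drifter = lifetime_payoff(entry_cost, per_round_gain=2, rounds_in_group=4)
    devotee = lifetime_payoff(entry_cost, per_round_gain=1, rounds_in_group=100)
    print(f"entry cost {entry_cost:2d}: drifter nets {drifter:4d}, devotee nets {devotee:4d}")
```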

Norenzayan and Shariff propose that
"the cultural spread of religious prosociality may have promoted stable levels of cooperation in large groups, where reputational and reciprocity incentives are insufficient. If so, then reminders of God may not only reduce cheating, but may also increase generosity toward strangers as much as reminders of secular institutions promoting prosocial behavior."
And they cite their own data to support it. But they also admit that "professions of religious belief can be easily faked", so that
"evolutionary pressures must have favored costly religious commitment, such as ritual participation and various restrictions on behavior, diet, and life-style, that validates the sincerity of otherwise unobservable religious belief."
In other words, anyone can talk the talk. But if you're willing to give all your money to the church and your twelve-year-old daughter to the patriarch, dude, you're obviously one of us.

Truth in Advertising is actually a pretty common phenomenon in nature. Chicken wattles are a case in point; what the hell good are those things, anyway? What do they do? Turns out that they display information about a bird's health, in a relatively unfakeable way. The world is full of creatures who lie about their attributes. Bluegills spread their gill covers when facing off against a competitor; cats go all puffy and arch-backed when getting ready to tussle. Both behaviors serve to make the performer seem larger than he really is— they lie, in other words. Chicken wattles aren't like that; they more honestly reflect the internal state of the animal. It takes metabolic energy to keep them plump and colorful. A rooster loaded down with parasites is a sad thing to see, his wattles all pale and dilapidated; a female can see instantly what kind of shape he's in by looking at those telltales. You might look to the peacock's tail for another example3, or the red ass of a healthy baboon. (We humans have our own telltales— lips, breasts, ripped pecs and triceps— but you haven't been able to count on those ever since implants, steroids, and Revlon came down the pike.) "Religious signaling" appears to be another case in point. As Norenzayan and Shariff point out, "religious groups imposing more costly requirements have members who are more committed." Hence,
"Religious communes were found to outlast those motivated by secular ideologies, such as socialism. … religious communes imposed more than twice as many costly requirements (including food taboos and fasts, constraints on material possessions, marriage, sex, and communication with the outside world) than secular ones… Importantly for costly religious signaling, the number of costly requirements predicted religious commune longevity after the study controlled for population size and income and the year the commune was founded… Finally, religious ideology was no longer a predictor of commune longevity, once the number of costly requirements was statistically controlled, which suggests that the survival advantage of religious communes was due to the greater costly commitment of their members, rather than other aspects of religious ideology."
Reread that last line. It's not the ideology per se that confers the advantage; it's the cost of the signal that matters. Once again, we strip away the curtain and God stands revealed as ecological energetics, writ in a fancy font.
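
For anyone fuzzy on what "statistically controlled" buys you there, here's a synthetic demo (fabricated data and statsmodels, nothing to do with the actual commune dataset). Longevity depends only on the requirement count, religious communes just happen to impose more requirements, and the "religion effect" evaporates the moment you hold cost constant:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 200
religious = rng.integers(0, 2, n).astype(float)
# Religious communes impose more costly requirements (fabricated numbers)...
requirements = rng.poisson(2 + 4 * religious).astype(float)
# ...and longevity depends on the requirements alone, not on the ideology.
longevity = 5 + 3 * requirements + rng.normal(0, 5, n)

naive = sm.OLS(longevity, sm.add_constant(religious)).fit()
both = sm.OLS(longevity, sm.add_constant(
    np.column_stack([religious, requirements]))).fit()
print("religion coefficient, naive:     ", naive.params[1].round(2))
print("religion coefficient, controlled:", both.params[1].round(2))
```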

These findings aren't carved in stone. A lot of the studies are correlational, the models are in their infancy, yadda yadda yadda. But the data are coming in thick and fast, and they point to a pretty plausible model:
  • Fear and stress result in loss of perceived control;
  • Loss of perceived control results in increased perception of nonexistent patterns (N&S again: "The tendency to detect agency in nature likely supplied the cognitive template that supports the pervasive belief in supernatural agents");
  • Those with right-wing political beliefs tend to scare more easily;
  • Authoritarian religious systems based on a snooping, surveillant God, with high membership costs and antipathy towards outsiders, are more cohesive, less invasible by cheaters, and longer-lived. They also tend to flourish in high-stress environments.
And there you have it. The Popular Power of Palin, explained. So the next question is

Now that we can explain the insanity, what are we going to do about it?

Coda 10/10/08: And as the tide turns, and the newsfeeds and Youtube videos pile up on my screen, the feature that distinguishes right from left seems ever-clearer: fear. See the angry mobs at Republican rallies. Listen to the shouts of terrorist and socialist and kill him! whenever Obama's name is mentioned. And just tonight, when even John McCain seemed to realise that things had gone too far, and tried to describe the hated enemy as "a decent man"— he was roundly booed by his own supporters.

How many times have the Dems had their asses handed to them by well-oiled Republican machinery? How many times have the Dems been shot down by the victorious forces of Nixons and Bushes? Were the Democrats ever this bloodthirsty in the face of defeat?

Oxley et al are really on to something. These people are fucking terrified.


Photo credit for Zombie Jesus: no clue. Someone just sent it to me.

1 And isn't that a nice CNNism for "moron"? It might seem like a pretty thin veil to you lot, but then again, CNN isn't worried about alienating viewers with higher-than-room-temperature IQs.
2 And to all you selfish-gene types out there, where you been? Group-selection is back in vogue this decade. Believe me, I was as surprised as you…
3 Although we might be getting into "Handicap Principle" territory here, which is a related but different wattle of fish. I confess I'm not up on the latest trends in this area…


Monday, September 15, 2008

Pedophilia in a Pill

You may remember the case a few years back of the Floridian hypersexual pedophile whose depravity hailed from a brain tumor; the dude (rightly) got off, since he wasn't culpable for the wiring in his head. You may even remember me taking the next step (scroll down to June 30th on the right-hand side), and remarking that the tumor didn't really make a difference— nobody is responsible for the way their heads are wired, and the legal system had taken the first step (again, rightly) towards acknowledging that the very concept of culpability, while convenient, is neurologically unsound.

Exhibit B*: Phillip Carmichael, a former Oxfordshire headmaster and pedophile, exonerated after a court decided that his extensive collection of child porn had been amassed while under the influence of prescription drugs. Once again we see evidence that we are mechanical. The very phrase "control yourself" is dualist at its heart, a logical impossibility. It conjures up images of a driver fighting to stop a careening car with bad brakes. But the fact is, there is no driver. There is only the car— we are the car— and when the brake lines have been cut, careening is just what cars do. Medical professionals prescribed a bunch of pills to this man, and they literally turned him into someone else.

You might think that this would make people feel a bit more kindly towards natural-born kiddy-diddlers. After all, if it's a chemical that turns you into a pervert, you're not really culpable, are you? You're taking the same drugs Carmichael was; the only difference is that they're not being produced by the factory Pharm down the road, they're being produced in your own head. If anything, natural-born pedophiles have even less choice in the matter than did our Exhibit B; at least Carmichael could have chosen more competent medical counsel.

I would be willing to bet, though, that most people would not think more kindly of pedophiles after performing this thought experiment, and in fact most people would vilify and shout down anyone who dared to make excuses for these monsters. Anything to do with kids is, by definition, a motherhood issue; and motherhood issues by definition turn us into irrational idiots.

But our legal systems generally define culpability in terms of whether offenders know that their acts are against the law, and by that standard I guess some kind of punishment is called for. Still. Let's at least be consistent about it, shall we? We know that a human system called Phillip Carmichael deliberately broke the law; it just wasn't the same Phillip Carmichael who ended up in court after the drugs were withdrawn. That Carmichael had been rebooted back into a benign, Linux sort of personality. The evil child-molesting Microsoft OS had been wiped. So if you want to be consistent about this, put Carmichael back on drugs until the guilty iteration reappears. Then put him in jail.

At least you'd know you have the right guy.


*Thanks to Nas Hedron for the link


Tuesday, June 17, 2008

I Am Fundamentalist, And So Can You.

This started out as a post about a recently-reported negative correlation between IQ and religious belief (thanks to Craig McGill for the link). It was going to be relatively restrained by local standards; while it's hard to resist the temptation to rub one's hands at yet more evidence that only Stupid People Have Imaginary Friends, I'd have voiced reservations over the unwarranted conflation of "academics" with "smart people" (believe me, there are as many dumb folks in those hallowed halls as there are anywhere else except maybe Fox News); the use of a single score to measure that multifaceted bag of traits we call "intelligence"; even the sloppiness of some of the third-party coverage (this headline, for example, gets the study's findings completely ass-backwards).

In the end, though, I decided to leave those poor bastards alone and come out of the closet myself.

I use the word "fundamentalist" in the sense promoted by Jonathan Rauch: anyone who cannot seriously entertain the possibility that they are wrong about their basic beliefs. It applies pretty obviously to Biblical literalists and their ilk, but the term is not limited to them. It extends to me. I suspect it even extends to the likes of Richard Dawkins, even though he has stated publicly that he would be willing to change his mind on the subject of God. All it would take, he says, is "evidence".

Which is a laudable attitude, and one that reflects the basic difference between science and religion. The question that's been occurring to me lately, though, is, what kind of evidence would it take to turn me into a believer? How much would be enough? God is such an outrageous proposition from so many angles that almost any alternative explanation would be more parsimonious. Mass hallucination. Brain tumors. The Matrix. Aliens with a propensity for juvenile practical jokes.

Imagine a scenario in which the heavens literally opened up, and a Big White Dude with a Mighty Beard and a flotilla of cherubim stroking His Divine Genitalia stared down at me through the clouds and proclaimed in a mighty voice,

I, God, exist! Take your photographs! Run your tricorder!
BELIEVE IN ME!


Would I believe? Fuck no. This has to be some kind of trick. And no matter how much evidence piles up — a smiley face embedded in pi at the thirteen-trillionth decimal place; a cosmological consensus that yes, there's really only one universe, and it really just does happen to be configured with all its physical constants tuned precisely to permit our existence; the literal appearance of the Four Horsemen — all of that, appearing in the face of such astronomically-massive odds, would still have to be weighed against the likelihood of the alternative.

What are the odds that I'm a brain in a tank or a computer simulation, and some bored undergrad is fucking with my sensory inputs? Pretty damn low. What are the odds that an entire physical multiverse was created by means unknown by an omnipotent omniscient sentient entity that exists eternally, without any cause or creator of its own?

Lower. Way lower. (Or at least, that model raises far more numerous and substantive questions than it pretends to answer.)

The bottleneck here is my own mental processes, my own ability to parse input from the outside world, to trust that said input even reflects an objective outside world. The limits are always in me; the brain contains too many tricks and shortcuts to trust implicitly, especially if it serves up something I consider impossible. Whatever input is thrust into my face, hack will always be a more parsimonious explanation than god.
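
You can even run the numbers, for suitably made-up values of "numbers":

```python
# The whole argument as Bayes' rule, with priors and likelihoods pulled out
# of the air. Even granting Divine Manifestation a 100:1 likelihood edge over
# "someone is hacking my inputs", a lopsided enough prior keeps the posterior
# parked on the side of the hack.
prior_god, prior_hack = 1e-12, 1e-4
lik_god, lik_hack = 1.0, 0.01        # P(miracle observed | hypothesis)
posterior_odds = (prior_god * lik_god) / (prior_hack * lik_hack)
print(f"posterior odds, god : hack = {posterior_odds:.0e}")  # ~1e-06
```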

Which leaves me unconvertible, and reduces me to the status of fundamentalist— and Dawkins' grand pronouncement about "evidence" to empty sophistry.

Sucks to introspect.


Saturday, March 29, 2008

Earth Hour. Because the World Isn't Worth a Whole Day.

Ninety percent of the world's charismatic megafauna is gone. Hormone disrupters are turning the fish off Lakeshore into hermaphrodites, if the tumors don't get them first. The Arctic is heading for ice-free status by 2030, the Wilkins Ice Shelf is a measly six kilometers away from disintegration, air pollution in this miserable dick-ass excuse for a country alone helps kill 16,000 people a year. How do we rise to this challenge? How do we lie in this bed we have made?

Earth Hour. Sixty minutes during which we turn out the lights and pat ourselves on the back for saving the planet. Kings, Corporations, and Communities are all very much on board with this, naturally: in what other context could anyone pose so publicly while actually doing so little? Today's edition of my local Toronto Star is creaming its jeans all over Earth Hour; they're giving it almost as much coverage as can be found in any three pages of the two thick sections they devote daily to selling automobiles. Hundreds, maybe thousands of Torontonians will celebrate the event by climbing into their SUVs and driving out to Downsview Park, there to light candles in the darkness. The Eaton Centre up at Yonge and Dundas is festooned with all sorts of big glossy posters trumpeting their whole-hearted love of Mother Earth. Why, I'll bet the reduced environmental impact from turning off those lights might even recoup a small fraction of the resources consumed to drive the massive multimedia extravaganza advertising Earth Hour.

Oh, wait. There isn't going to be any reduction in environmental impact. Not unless the world's power-generating utilities decide to scale back the fossil fuels they're burning to reflect a one-time, one-hour tick in the time series.

Yes, I know. It's only supposed to make "a statement". It's supposed to be a symbol. And what does it symbolize, exactly? It symbolizes "hope" — which is to say, our infinite capacity for denial, our unwillingness to restrain ourselves in any meaningful sense, our brain-dead refusal to see the brick wall we're hurtling towards. It symbolizes the sick fucking joke that is the human race.

Back in the early nineties I had a girlfriend who volunteered for the Guelph branch of OPIRG. Sick of the flood of smiley-faced books and schizoid puff pieces insisting that being green doesn't mean giving up your second SUV ("And now I sleep just fine at night, knowing that by serving one meat-free meal a week, I'm doing My Part to Save the Planet!"), she proposed countermeasures: a booklet entitled "Fifty Ways to Ease Your Conscience While Continuing to Destroy the Environment." I thought it was a brilliant idea. Everyone at OPIRG absolutely hated it. Too cynical, they said. Too negative. It'll alienate more people than it converts. We must be cheerful. We must be positive.

Evidently this is a fairly common rule among environmental activists afraid of alienating the skittish: No Cynicism. (Which, these days, is tantamount to saying No Cognition...) And so now, after more than a decade of putting on a happy face to keep from scaring the soccer moms, here we are: Earth Hour.

How far we've come.

There was never a time when things could be turned around with such petty gestures. You want to effect real change? You've got to address the root of the problem: human psychology. We evolved in the moment, we evolved to recognize imminent and proximate threats: pestilence, predators, an alpha male coming at us with murder in his eyes. The sight of a rotting corpse or a deformed child makes us squirm; the toothy smile of a great white freezes our blood. But we never evolved to internalize graphs and columns of statistics. They may be real; they just don't feel that way.

They're starting to now, though. Now, even here in the privileged and so-called "developed" world, we're starting to reap what we sow. The outbreaks break out ever-faster, the critters on our doorsteps die in record numbers. But even now, that's just us— and we're not the ones calling the shots. The ones piloting the Titanic are way up in the bridge, isolated, unaffected, never more than a heartbeat from sparkling sands and clean water and the very best in medical care. It's still gonna be a while before the shit piles high enough to matter to them. And so they'll do nothing, because for them the threat is not imminent; and if it is not imminent, neither is it real. So sayeth the Human gut.

So, you want to effect real change? You've got to make the threat matter to the ones who matter. You have to take the shit into their hallways until even they can smell it. You have to threaten something valuable to them, and threaten it now, if you want to awaken that fierce innovative spark of self-preservation that burns brightest when the danger is in your face and the piss is running down your leg.

This is what you'd have to do: hunt down the Harpers and the Gordons and the Martins, the Roves and Cheneys, the Harrises and the Kleins and Bairds. (You might want to hunt down the Dubyas, too— they don't make any of the real decisions, but the symbolism is important.) Dig up the carcass of Dixie Lee Ray while you're at it, and throw its sorry rotten parts into the corral with her living soul mates. (For seasoning, you know.) Hunt down every pundit and commentator who, after years ridiculing the signposts, now shrugs and says Oh, well, I guess we fucked up the planet after all. Too late to fix it now, let's just adapt and make sure that economic growth doesn't drop below five percent... Take every family member who sided with any of them (most have); explain to them all the proximate nature of threat-perception in the human animal, and that you're going to motivate them the only way you can.

Then kill half of them. Give the other half a year to fix things. Hold back their families in, as the publishers say, "reasonable amounts against returns".

That's probably what it would take to get these people to give a shit.

Of course, you could never pull it off. All that security, all that well-founded fear of those being governed. And you know, even if the bridge crew did suddenly get serious and try to turn things around, we're still in for a really rough ride. The trajectory of a planetary biosphere is not something you can change on a dime, especially not after the race downhill has been picking up speed for half a century. It's probably too late no matter what we do, unless Venter and Kurzweil turn out to be right.

Still, there's something to be said for simple accountability. And you might even find allies in some pretty unlikely places. Air pollution alone must kill more people in a month than all the serial killers anyone ever sent to the gas chamber; any death-penalty advocate capable of even rudimentary logic would pretty much have to get on board...

Anyway. Pondering such solutions will make my Earth Hour go down a little easier, as I sit here in the dark. I hope it does the same for you.


Tuesday, February 26, 2008

Law & Order: Victims of Reality Unit

So Prozac and its ilk prove to be, for the most part, about as clinically effective as a sugar pill. Which kicks loose an idea for a story that's been rattling around in my head for a few years now:

A man diagnosed with terminal cancer is beating the odds with the help of a new drug recently approved by the FDA. The tumors have stabilised, perhaps even receded a little; he has already lived well past his mean life expectancy. It's a breakthrough, a miracle — until a couple of statisticians from Johns Hopkins publish an analysis proving that the effect is pure placebo. Our patient reads the study. Within a month, he's circling the drain. Within two, he's dead.

The next of kin charge the authors of the paper, and the journal that published them, with negligent homicide.

Placebos work, you see. The brain can do all sorts of things to the body; sometimes it just needs to be tricked into generating the right happy chemicals. Medical professionals know as much: it may not be the cure so much as the belief in the cure that does the trick, and when you shatter that belief, you are knowingly stealing hope and health from every patient who heeds your words. You are, in a very real sense, killing them.

Do we have here a legitimate argument for the perpetuation of ignorance? Medical professionals do not generally discourage the use of prayer in dire circumstance. It does no harm, after all. (Actually there's some evidence that it does do harm; let's set that aside for the moment.) But when you know that placebo effects are real, and you go out of your way to disillusion some deluded flake who shows up on the ward convinced that her crystals and magnets will keep the tumors at bay... well, maybe education of the sick should be a criminal offense.

I'm just saying.


Saturday, January 5, 2008

Cancer, For the Greater Good

One of my favorite monster movies of all time has got to be John Carpenter's 1982 remake of “The Thing”. It's not a perfect film by any means – there are some gaffes with the rubber fx, and if eighties-era PCs ever came preloaded with software to test whether your buddies had been taken over by shapeshifting aliens, I never saw it. (A shame. Back then I would've bought something like that in a second. These days, it's pretty much redundant.) Still, “The Thing” is one of the few movies of its type in which the Human cast isn't as dumb as a sack of Huckabees. Nobody wanders off after the ship's cat all by themselves. The moment they figure out what they're up against they burn everything that moves, start thinking about serological detection methods, and keep really close watch on each other. It's an effective study in paranoia, starring an alien that not only squirts slime and sprouts tentacles, but actually proves to be quite a bit more intelligent than the humans it preys upon. (As one might expect from a creature with interstellar technology. Am I the only one bothered by the fact that the monster in the Howard Hawks original never did anything smarter than just kinda flailing around and roaring?) Even at the scorched-earth end of the story, you're never really sure who won.

Then there's the biology.

It's actually not as totally whacked-out as you may think. Granted, anything able to morph between body plans in the time it takes me to snarf an Egg McMuffin would have to have stores of cellular energy verging on the nuclear. (Jeff Goldblum's gradual, donut-powered transformation in “The Fly” was a lot more believable – although why those telepods got all confused at the presence of fly DNA, when they didn't seem to bat a diode at the bacteria seething on every square micron of both fly and human, remains an open question. But I digress.) Still, if you can forgive the ridiculously fast transformation, the idea of an infectious agent retooling infected cells for its own purposes is old news. Viruses have been doing it for billions of years.

Now we are too. Synthetic Life's current rock star, J. Craig Venter, is all over the lit with his artificial chromosomes and Swiss-Army cells: build a cellular chassis that carries the basic instruction set necessary for metabolism, and then top it off with genes to produce whatever you're after this week. Before long, Venter's Vats (and only Venter's vats, if the patents go through) will be churning out great masses of everything from Nutripon to Biogasoline.

But more interesting, to me, is this recent paper out of PLoS Computational Biology on “Somatic Evolution”— i.e., the classic Darwinian struggle for existence among the cells of a single tissue in a single organism. And why shouldn't the rules of natural selection apply to cells as well as their owners? The cells in your liver exist in a habitat with limited food, just like populations of multicellular creatures. They jostle up against others like themselves who have their eye on the same nutrients. Given a mutation that allowed one such cell to outcompete its siblings — faster reproductive rate, lower mortality — wouldn't its offspring kick the asses of the competition? Wouldn't the whole tissue, the whole organ, evolve into something new, something where every cell was out for itself, something like —

—Well, cancer, obviously.

Don't bother pointing out the obvious. Yes, if our cells did follow the beat of their own drummer, multicellularity would never have evolved in the first place. But that's circular; there's nothing in the rules that says multicellularity had to evolve, and logically Darwin's hand should be felt down in the blood as well as out on the savannah. Something must have suppressed those processes at the cellular level before metazoans could arise; that's what this paper is about.
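
The arithmetic is trivial to watch in action. A toy version (parameters invented):

```python
import random

# Toy somatic selection: a tissue capped at 1,000 cells. Each generation,
# cells reproduce in proportion to their division rate; rare mutants divide
# 1.5x faster. Watch the cheater lineage sweep the tissue.
random.seed(3)
tissue = [1.0] * 1000                  # per-cell division rate
for generation in range(60):
    tissue = random.choices(tissue, weights=tissue, k=len(tissue))
    tissue = [r * 1.5 if random.random() < 1e-3 else r for r in tissue]
    if generation % 15 == 0:
        mutants = sum(r > 1.0 for r in tissue) / len(tissue)
        print(f"gen {generation:2d}: {mutants:5.1%} of the tissue is mutant")
```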

But now I'm thinking on a tangent. I remember our old friends the scramblers, and how it was possible for them to evolve without genes:
"I'd swear half the immune system is actively targetting the other half. It's not just the immune system, either. Parts of the nervous system seem to be trying to, well, hack each other. I think they evolve intraorganismally, as insane as that sounds. The whole organism's at war with itself on the tissue level, it's got some kind of cellular Red Queen thing happening. Like setting up a colony of interacting tumors, and counting on fierce competition to keep any one of them from getting out of hand. Seems to serve the same role as sex and mutation does for us."

And I remember MacReady in Carpenter's film, after Norris split into several different pieces to keep from burning alive, internalising the take-home lesson:

"Every part of him was a whole. Every little piece was an individual animal with a built-in desire to protect its own life. Y'see, when a man bleeds... it's just tissue. But blood from one of you things won't obey when it's attacked. It'll try and survive. Crawl away from a hot needle, say..."

Cancer, for the greater good.

Maybe that's where people and scramblers and MacReady-battling Things went their separate ways. We tamed our inner battles using stem cells and transient cells and differentiated tissues, just like Pepper et al. hypothesise. But maybe other worlds spawned other answers. Maybe whatever alien slime mold gave rise to our Antarctic shapeshifter decided to go with the whole cell-competition thing, decided to make it a solution instead of a problem. Maybe that's how all those cells remain totipotent even in the adult animal; or maybe some tentacle-whipping alien J. Craig Venter just figured out how to go back and retrofit his species for optimal adaptability and maximum profit. Of course they could do it, even if they didn't evolve that way. They built flying saucers, for Chrissakes. They were crossing the interstellar gulf before we'd even made it out of Africa. What better failsafe for lost and stranded travellers than to be able to take your cue from the natives, slip into a new body precustomised for its environment?

I read “Who Goes There?” back in the eighties, decades after John W. Campbell wrote it and about six months before Carpenter's unjustly-maligned remake hit the theatres. I thought it was très cool from the outset. But it never occurred to me to write a sequel until I read this paper...


Wednesday, November 21, 2007

The End of Art

This whole stem-cell breakthrough is certainly worth keeping track of, but not here, because you know about it already; it's all over other sites far more popular than mine. Ditto the hilarious perspective on WoW which serves as the subject of today's visual aid, starring characters which many of us must know (albeit in roles with more contemporary fashion sense). No, today I'm going to direct your attention to neuroaesthetics, and the following question:

Have you ever seen an ugly fractal?

I haven't. I wouldn't hang every fractal I've ever seen in my living room (even during my Roger Dean phase) — but it wasn't the essential form that turned me off those iterations, it was the color scheme. And such schemes aren't intrinsic to the math; they're arbitrary, a programmer's decision to render this isocline in red and that in blue and not the other way around.

I would argue that fractals, as mathematical entities, are, well, appealing. Aesthetically. All of them. It's something I've batted around with friends and colleagues at least since the mid-eighties, and speaking as a former biologist it has a certain hand-wavey appeal because you can see how an appreciation of fractal geometry might evolve. After all, nature is fractal; and the more fractal a natural environment might be, the greater the diversity of opportunity. An endlessly bifurcating forest; a complex jumble of rocky geometry; a salt plain. Which environments contain more niches, more places to hide, more foraging opportunities, more trophic pathways and redundant backup circuits? Doesn't it make sense that natural selection would reward us for hanging out in complex, high-opportunity environments? Couldn't that explain aesthetics, in the same way that natural selection gave us* rhythm and the orgasm**? Couldn't that explain art?

Maybe. Maybe not. Because firstly (as I'm sure some of you have already chimed in), complex environments also contain more places for predators and competitors to hide and jump out at you. There are costs as well as benefits, and the latter better outweigh the former if fractophilia is going to take hold in the population at large. Also, who says all art is fractal? Sure, landscapes and still lifes. Maybe even those weird cubist and impressionist thingies. But faces aren't fractal; what about portraiture?

The obvious answer is that the recognition and appreciation of faces has got obvious fitness value too, and aesthetics is a big tent; nothing says "art" can't appeal to the fusiform gyrus as well as whatever Mandelbrot Modules we might prove to have. But now along comes this intriguing little paper (update 22/11 — sorry, forgot to add the link yesterday) in Network, which suggests that even though faces themselves are not fractal, artistic renditions of faces are; that artists tend to increase the aesthetic appeal of their portraits by introducing into their work scale-invariant properties that don't exist in the original. Even when dealing with "representational" works, evidently, true art consists of fractalizing the nonfractal.
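
By "scale-invariant" I mean the measurable kind, not the metaphorical kind: you estimate a box-counting dimension and check whether the structure holds across scales. Here's a minimal sketch of the standard estimator (my implementation, not the paper's):

```python
import numpy as np

def box_counting_dimension(img: np.ndarray) -> float:
    """Estimate fractal dimension of a binary image: count occupied boxes
    N(s) at several box sizes s, then fit log N = -D log s + c."""
    sizes, counts = [], []
    for s in (2, 4, 8, 16, 32):
        h, w = img.shape[0] // s * s, img.shape[1] // s * s
        blocks = img[:h, :w].reshape(h // s, s, w // s, s)
        sizes.append(s)
        counts.append(max(int(blocks.any(axis=(1, 3)).sum()), 1))
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

# Sanity check: a filled square should come out near dimension 2.
print(box_counting_dimension(np.ones((256, 256), dtype=bool)))
```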

What we're talking about, folks, may be the end of art as we know it. Go a little further down this road and every mathematician with a graphics tablet will be able to create a visual work that is empirically, demonstrably, beautiful. Personal taste will reduce to measurable variations in aesthetic sensibilities resulting from different lifetime experiences; you will be able to commission a work tweaked to appeal to that precise sensibility. Art will have become a designer drug.

Way back in the early seventies, a story from a guy called Burt Filer appeared in Harlan Ellison's Again, Dangerous Visions. It is called "Eye of the Beholder", and it begins thusly:

THE NEW YORK TIMES, Section 2, Sunday June 3rd by Audrey Keyes. Peter Lukas' long-awaited show opened at the Guggenheim today, and may have shaken confidence in the oldest tenet of art itself: that beauty is in the eye of the beholder. Reactions to his work were uncannily uniform, as if the subjective element had been removed...


Filer wrote his story before anyone even knew what a fractal was. (His guess was that aesthetics could be quantified using derivatives, a miscall that detracts absolutely nothing from the story.) "Beholder" wasn't his first published work; in fact, as far as I can tell, it may have been his last. (That would be fitting indeed.) I don't know if the man's even still alive.

But if you're out there, Burt: dude, you called it.


*Well, some of us.
** Ditto.


Tuesday, October 9, 2007

The View From The Left

This is an ancient review article — about ten years old, judging by the references — but it contains an intriguing insight from split-brain research that I hadn't encountered before: The right hemisphere remembers stuff with a minimum of elaboration, pretty much as it happens. The left hemisphere makes shit up. Mr. Right just parses things relatively agenda-free, while the left hemisphere tries to force things into context.

The left hemisphere, according to Gazzaniga, looks for patterns. Ol' Lefty's on a quest for meaning.

I learned back in undergrad days that our brains see patterns even where none exist; we're pattern-matching machines, is what we are. But I hadn't realized that such functions were lateralized. This hemispheric specialization strikes me as a little reminiscent of "gene duplication": that process by which genetic replication goes occasionally off the rails and serves up two (or more) copies of a gene where only one had existed before. Which is very useful, because evolution can now play around with one of those copies to its heart's content, and as long as the other retains its original function you don't have to worry about screwing up a vital piece of a working system. (This is something the creationists hope you never learn, since it single-handedly blows their whole the-mousetrap-can't-work-unless-all-the-parts-evolve-simultaneously argument right out of the water.) Analogously, I see one hemisphere experimenting with different functions — imagination, the search for meaning— while the other retains the basic just-the-facts-ma'am approach that traditionally served the organism so well.

Anyway, for whatever reason, we've got a pragmatist hemisphere, and a philosopher hemisphere. Lefty, who imposes patterns even on noise, unsurprisingly turns out to be the source of most false memories. But pattern-matching, the integration of scattered data into cohesive working models of The Way Things Are — that's almost another word for science, isn't it? And a search for deeper meanings, for the reasons behind the way things are — well, that's not exactly formal religion (it doesn't involve parasitic social constructs designed to exploit believers), but it is, perhaps, the religious impulse that formal religion evolved to exploit. Which is getting uncomfortably close to saying that neurologically, the scientific and religious impulses are different facets of the same thing.

Yes, all those mush mouthed self-proclaimed would-be reconcilers have been saying that shit for decades. I still bet you never thought you'd read it here.

But bear with. A compulsion to find meaning and order. When there is a pattern to be found, and enough usable data to parse it, the adaptive significance is obvious: you end up using the stars to predict when the Nile is going to flood its banks. If there is no data, or no pattern, you find it anyway, only it's bogus: thunder comes from Zeus, and Noah surfed a tidal bore that carved out the Grand Canyon in an afternoon. Lefty talks in metaphors sometimes, so even when it gets something right it's not the best at communicating those insights— but that's okay, because Mr. Right is just across the hall, unsullied, unspecialized, picking up the slack.

Only what if, now, we're acquiring data that Mr. Right can't handle? The Human brain is not designed to parse the spaces between galaxies or between quarks. The scales we evolved to handle extend up or down a few orders of magnitude, losing relevance at each iteration. Are things below the Planck length really, empirically more absurd than those at everyday classical scales, or is it just that brains shaped to function at one scale aren't very good at parsing the other?

Maybe this is where Lefty really comes into his own. Like the thermoregulating feather that got press-ganged, fully-formed, into flight duty, perhaps the bogus-pattern-matching, compulsive purpose-seeking, religious wetware of the brain is most suited for finding patterns it once had to invent, back before there were enough data available to justify such cosmological pretzel logic. Perhaps the next stage is to rewire Mr. Right in Lefty's image, turn the whole brain into a lateral-parsing parallel-processor. Perhaps the next stage of scientific enquiry can only be conveyed by speaking in tongues, practiced by colonies of monks whose metaphors must be parsed by the nonconscious modules of Siri Keeton and his synthesist siblinghood. Maybe the future is a fusion of the religious and the empirical.

Of course, the obvious rejoinder is: if all this late-breaking twenty-first-century data is enough to let the religious impulse do something useful for a change, why is it that religious fundamentalists are still such colossal boneheads? Why, if delusion has segued into profound insight, do half the Murricans out there still believe that the universe is six thousand years old? Why do two thirds of them believe in angels?

And the obvious answer is that, appearances notwithstanding, these people are not living in the twenty-first century at all, but the fourteenth. They walk among us locked into a cultural diving bell reeled out along the centuries, hermetically sealed, impervious to any facts or insights more recent than the spheroid Earth (or even older, in the case of at least one ignorant cow on The View). I can only wonder what would happen if somehow that brittle armor were to shatter, if all this real data were to wash over them and somehow penetrate the circuitry that informs their spastic gyrations and religious gibbering. Would they serve up a Theory of Everything? Would the rest of us recognize it if they did?

Probably no, and probably not. It's just idle speculation, smoke blown out my mind's ass. Still. Might be a story in it somewhere: the day when religion subsumed science, and It Was Good.

At least no one could accuse me of getting into a rut.


Thursday, September 6, 2007

Do-It-Yourself Zombiehood

New to me, old to the lit: a paper in Trends in Cognitive Sciences, which came out last November (just a month after Blindsight was released): "Attention and consciousness: two distinct brain processes".

Let me cherry-pick a few choice excerpts: "The close relationship between attention and consciousness has led many scholars to conflate these processes." ... "This article ... argu[es] that top-down attention and consciousness are distinct phenomena that need not occur together" ... "events or objects can be attended to without being consciously perceived."

Yes, part of me shouts in vindication, while the rest of me whispers Oh your god, please no.

It's a review article, not original research. As such it cites some of the same studies and examples I drew on while writing Blindsight. But what especially interested me was the suggestion of mechanism behind some of those results. Both Blindsight and Blog cite studies showing that being distracted from a problem actually improves your decision-making skills, or that we are paradoxically better at detecting subtle stimuli in "noisy" environments than in "clean" ones. Koch and Tsuchiya cite a paper that describes this as a form of competition between neuron clusters:
"attention acts as a winner-takes-all, enhancing one coalition of neurons (representing the attended object) at the expense of others (non-attended stimuli). Paradoxically, reducing attention can enhance awareness and certain behaviors."

I like this. It's almost ecological. Predators increase the diversity of their own prey species by keeping the most productive ones in check; remove the starfish from a multispecies intertidal assemblage and the whole neighborhood turns to mussels inside a month. This is the same sort of thing (except it happens within a single brain and therefore tastes more of Lamarck than Darwin). Different functional clusters (the different prey species) duke it out for attention, each containing legitimate data about the environment— but only the winner (i.e., the mussels) gets to tell its tale to the pointy-haired boss. All that other data just gets lost. And the static that paradoxically improves performance in such cases — white noise, or irrelevant anagrams that steal one's focus — plays the role of the predator, reducing the advantage of the front-runner so that subordinate subroutines can get their voices heard.
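
You can watch the predator at work in a toy winner-takes-all race (numbers invented): three coalitions with fixed evidence, Gaussian noise standing in for the static. Crank the noise and the front-runner's monopoly erodes, and the subordinate coalitions get their turns at the boss's ear:

```python
import numpy as np

rng = np.random.default_rng(4)
evidence = np.array([1.0, 0.8, 0.6])   # three coalitions, one front-runner

def win_shares(noise_sd, trials=10_000):
    samples = evidence + rng.normal(0, noise_sd, (trials, 3))
    return np.bincount(samples.argmax(axis=1), minlength=3) / trials

for sd in (0.05, 0.3, 1.0):            # more static = more noise
    print(f"noise {sd:.2f}: win shares {win_shares(sd).round(2)}")
```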

I wonder. If we trained ourselves to live in a state of constant self-imposed distraction, could we desentientise our own brains...?


Sunday, August 12, 2007

Selfish Bastards, Every One

Now and then I've fielded questions — in interviews, private e-mails, maybe even here in the 'crawl — about my reductionist take on human nature. In particular, a lot of folks are not comfy with my dissing of altruism, which (if it ever does arise in a population) is likely to get weeded out real fast because Hey, who's going to leave more offspring to the next generation: the selfless doof who gives up his life jacket on the Titanic or the selfish bastard who takes it for himself?

Seems pretty straightforward to me, but it seems to give pause to a lot of folks (I even recently received an e-mail on the subject from the legendary Ted Chiang). What about Mothers who rescue their babies from burning buildings? some of the most egregiously out-clued might ask (A: Kin selection, dummies). What about people who willingly die for their countries or for their religious beliefs? (Yeah, and if Christ had said "Do unto others, turn the other cheek, walk the second mile and in the end you'll go to hell anyway", I'm sure the Christians would've just been lining up to go one-on-one with the lions.) What about people who just act out of the goodness of their hearts and help out those who are not so fortunate, even if they're atheists or unrelated to the beneficiary? (Ah, you mean reciprocal altruism. That's done in expectation of a payoff somewhere down the road— and remind me to scribble a post at some point reviewing what we do to people who accept our kind gestures and then don't reciprocate...)

Yeah, well, um— yeah, what about people who give to panhandlers, or volunteer for good causes even though there's no way some rubby or Malawian foster-child will ever be able to return the favour?

Hmmm.

This last challenge never really shook my position much. I can rattle off "status enhancement/increased mating opportunities" as fast as the next guy. Still, I wasn't aware of any actual studies on humans that backed it up. But now there is one, courtesy of the niggardly cocksuckers at the Journal of Personality and Social Psychology, who — despite my online access privileges as a postdoctoral fellow at a major academic institution — still want to charge me $30 before letting me see any more than the abstract. Screw that. Fortunately there's a layperson-friendly summary in The Economist. So here's the he-said/she-said version:

Men, like most male mammals, like to acquire resources. When they're not especially horny, they're as likely to go for furniture and big-screen TVs — i.e., major, nonportable items that remain in the home — as anything else. When they're horny, however, they'd rather buy bling and fast cars — flashy stuff they can take on the road to attract mates. Also, when in a horny mood, they're more likely to give publicly to panhandlers (also to indulge in risky/heroic behaviour). In other words, both conspicuous consumption and conspicuous generosity are just ways of attracting mates: hey baby, lookit me! I've got so much money I can just give it away!...

Women are no better. They aren't so much into resource acquisition as they are into volunteer work and do-gooding social causes — and once again, when they're not thinking about sex, they don't really care what kind of good they're doing. When horned up, however, women show a distinct preference for conspicuous do-gooding (working in a homeless shelter, for example), while shying away from other kinds (e.g., going off on their own and picking up garbage in a ravine).

So once again, behaviour that seems noble at first glance turns out to be stone self-serving upon closer examination: another brand of faux altruism that has far more in common with peacock's tails and wattles on chickens than with any spark of divine generosity. What's more, the nature of our displays breaks down along the same stereotypic r/K selection lines that have always (understandably) driven feminists up the wall because seriously, who really wants to believe that sex-is-destiny shit anyway?

Not that this should come as news to anyone. (Have any of the men in the audience ever been targeted by a street vendor with an armload of overpriced roses when they weren't in the company of a woman?) Still, it's nice to see actual data backing up the just-so story.

Now, anybody know of any cases of human altruism that haven't been exposed as kin selection or sleazy get-laid strategies?


Tuesday, May 22, 2007

Motherhood Issues

How many times have you heard new parents, their eyes bright with happy delirium (or perhaps just lack of sleep), insisting that you don't know what love is until you first lay eyes on your baby? How many of you have reunited with old university buddies who have grown up and spawned, only to find that mouths which once argued about hyperspace and acid rain can't seem to open now without veering into the realm of child-rearing? How many commercials have you seen that sell steel-belted radials by plunking a baby onto one? How many times has rational discourse been utterly short-circuited the moment someone cries "Please, someone think of the children!"? (I've noticed the aquarium industry is particularly fond of this latter strategy whenever anyone suggests shutting down their captive whale displays.)

You know all this, of course. You know the wiring and the rationale behind it: the genes build us to protect the datastream. The only reason we exist is to replicate that information and keep it moving into the future. It's a drive as old as life itself. But here's the thing: rutting and reproduction are not the traits we choose to exalt ourselves for. It's not sprogs, but spirit, that casts us in God's image. What separates us from the beasts of the field is our minds, our intellects. This, we insist, is what makes us truly human.

Which logically means that parents are less human than the rest of us.

Stick with me here. All of us are driven by brainstem imperatives. We are all compromised: none of us is a paragon of intellect or rationality. Still, some are more equal than others. There is a whole set of behavioural subroutines that never run until we've actually pupped, a whole series of sleeper programs that kick in on that fateful moment when we stare into our child's eyes for the first time, hear the weird Middle Eastern Dylan riffs whining in our ears, and realise that holy shit, we're Cylons.

That is the moment when everything changes. Our children become the most important thing in the world, the center of existence. We would save our own and let ten others die, if it came to that. The rational truth of the matter— that we have squeezed out one more large mammal in a population of 6.5 billion, which will in all likelihood accomplish nothing more than play video games, watch Inuit Idol, and live beyond its means until the ceiling crashes in— is something that simply doesn't compute. We look into those bright and greedy eyes and see a world-class athlete, or a Nobel Prize-winner, or the next figurehead of global faux-democracy delivered unto us by Diebold and Halliburton.

We do not see the reality, because seeing reality would compromise genetic imperatives. We become lesser intellects. The parental subroutines kick in and we lose large chunks of the very spark that, by our own lights, makes us human.

So why not recognise that with a new political movement? Call it the "Free Agent Party", and build its guiding principles along the sliding scale of intellectual impairment. Those shackled by addictions that skew the mind — whether pharmaceutically, religiously, or parentally induced — are treated the same way we treat those who have yet to reach the age of majority, and for pretty much the same reasons. Why do we deny driver's licences and voting privileges to the young? Why do we ban drunks from the driver's seat? Because they are not ready. They are not competent to make reasonable decisions. Nobody questions this in today's society. So tell me, how are offspring addicts any different?

I'm thinking of adding such a political movement to the noisy (and slightly satirical) background of an upcoming novel, but the more I think of it, the more it strikes me as an idea whose time has come. It's a no-lose electoral platform as far as I can see.

Now go find me a campaign manager.


Sunday, May 20, 2007

How to Build a Zombie Detector

A fair number of topics jostling for attention lately: slime moulds outfitted with skittish cyborg exoskeletons, Jim Munroe's microbudget megasavvy take on nanotech, even this recent research on free will in fruit flies (which I'm wary of, but am holding off commenting upon until I've thoroughly read the original paper). And I'm in bitten-off-more-than-I-can-chew mode at the moment, so I don't have time to put all that stuff on the crawl right now. But there is one thing that struck me like a bolt from the blue (except it was actually a bolt from an e-mail server) late last night, as I was trying to clear away the e-mail backlog:

Zombie detectors.

There's this guy, allegedly named Nick Alcock, who seems to know way more than he admits to. He first ruined my morning back in March by pointing out that if vampires really needed to eat people because they couldn't synthesise gamma-Protocadherin-Y on their own, and if they needed that protein because it was so damned critical for CNS development, then women shouldn't have working brains because the gene that codes for it is located on the Y chromosome. It was a shot across the bow I could not resist; we're still going at it two months later.

One of the things we've veered into lately is the classic philosopher-wank question: if you've got a nonconscious zombie that natural selection has nonetheless shaped to blend in — to behave as though it were conscious (we're talking the classic philosophical-zombie agent here, not the fast killer-zombies under discussion a few days ago) — how could you detect it? More fundamentally, why would you bother? After all, if it behaves exactly like the rest of us, then the fact that it's nonconscious makes no real difference; and if it does behave differently, then consciousness must have some impact on the decision-making process, findings about after-the-fact volition notwithstanding. (The cast of Blindsight mumble about this dilemma near the end of the book; it's basically a variant on the whole "I know I'm conscious but how do I know anyone else is" riff.)

So this Alcock dude points out that if I'm right in my (parroted) claims that consciousness is actually expensive, metabolically, then zombie brains will be firing fewer synapses and burning through less glucose than would a comparable baseline human performing the same mental tasks. And that reminded me of a paper I read a few years back which showed that fast thinkers, high-IQ types, actually use less of their brains than the unwashed masses; their neural circuitry is more efficient, unnecessary synapses pared away.

Zombie brains run cooler than ours. Even if they mimic our behavior exactly, the computational expense behind that behavior will be lower. You can use an MRI to detect zombies!
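
If you want the test itself sketched out: take repeated readings of task-evoked metabolic load, compare the suspect against a pool of matched baseline humans, and flag anyone sitting conspicuously below the distribution. A minimal sketch, assuming (generously) that you could get clean readings in the first place; all the numbers are invented:

    from statistics import mean, stdev

    def looks_like_a_zombie(suspect_readings, baseline_readings, z_cutoff=-3.0):
        """Flag a suspect whose task-evoked metabolic load falls far below
        the distribution of matched, presumed-conscious baselines.
        One-tailed on purpose: zombies should run cooler, never hotter."""
        mu = mean(baseline_readings)
        sigma = stdev(baseline_readings)
        z = (mean(suspect_readings) - mu) / sigma
        return z < z_cutoff

    # Invented numbers: baselines burn ~100 units on the task, the suspect ~85.
    baselines = [98.2, 101.5, 99.7, 100.3, 102.1, 97.9, 100.8]
    suspect = [85.1, 84.7, 86.0, 85.4]
    print(looks_like_a_zombie(suspect, baselines))  # True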


Of course, now Nick has turned around and pointed out all the reasons that would never work, because it is his sacred mission in life to never be satisfied. He's pointing out the huge variation in background processing, the minuscule signal one would have to read against that, the impossibility of finding a zombie and a sentry (trademark!) so metabolically identical that you could actually weed out the confounds. I say, fuck that. There are places where the conscious and subconscious minds interface: I say, look at the anterior cingulate gyrus (for example), and don't bother with crude glucose-metabolism/gas-mileage measures. There's gotta be some telltale pattern in there, some trademark spark of lightning that flickers when the pointy-haired boss sends a memo. That's what you look for. The signature of the ball and chain.
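
Although, to give Nick his due, you can put the variance objection into numbers: the scans you'd need per suspect grow with the square of noise over signal, so a minuscule zombie discount buried in noisy background processing gets expensive fast. Standard back-of-envelope power arithmetic, with the effect sizes made up:

    # n ~ ((z_alpha + z_beta) * sigma / delta)**2 scans per suspect for a
    # one-tailed test at ~5% false positives and 80% power, where delta is
    # the zombie's metabolic discount and sigma the measurement noise.
    def scans_needed(delta, sigma, z_alpha=1.64, z_beta=0.84):
        return ((z_alpha + z_beta) * sigma / delta) ** 2

    print(scans_needed(delta=1.0, sigma=2.0))   # ~25 scans: conceivable
    print(scans_needed(delta=0.1, sigma=5.0))   # ~15,000 scans: forget it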

Of course, it won't be enough for this Alcock guy. He's bound to find some flaw in that response. He always does.

Maybe I just won't tell him.
